Kullback-Leibler Information-Based Tests of Fit for Inverse Gaussian Distribution


Similar articles

Goodness of Fit Test for Gumbel Distribution Based on Kullback-Leibler Information Using Several Different Estimators

In this paper, our objective is to test the statistical hypothesis H_0: F(x) = F_0(x) for all x against H_1: F(x) ≠ F_0(x) for some x, where F_0(x) is a known distribution function. In this study, a goodness-of-fit test statistic for the Gumbel distribution based on Kullback-Leibler information is studied. The performance of the test under simple random sampling is investigated u...
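The idea behind such tests can be sketched briefly: estimate the differential entropy of the sample nonparametrically (e.g. with a Vasicek-type spacing estimator) and combine it with the log-density of the hypothesized distribution to obtain an estimate of the Kullback-Leibler information, which is near zero under H_0 and large under the alternative. The following is an illustrative sketch only, not the authors' exact procedure; the function names, the window size m, and the use of the Gumbel distribution are assumptions for demonstration:

```python
import numpy as np
from scipy import stats

def vasicek_entropy(x, m):
    """Spacing-based (Vasicek-type) estimate of differential entropy."""
    x = np.sort(np.asarray(x))
    n = len(x)
    idx = np.arange(n)
    upper = x[np.minimum(idx + m, n - 1)]  # x_(i+m), truncated at the ends
    lower = x[np.maximum(idx - m, 0)]      # x_(i-m), truncated at the ends
    return np.mean(np.log(n / (2 * m) * (upper - lower)))

def kl_gof_statistic(x, logpdf0, m=5):
    """Estimated KL information: -H_mn(sample) - mean log f0(x).
    Near zero when the data follow f0; large values suggest rejecting H0."""
    return -vasicek_entropy(x, m) - np.mean(logpdf0(x))

rng = np.random.default_rng(0)
# Under H0: data really are Gumbel, so the statistic stays small.
t_good = kl_gof_statistic(stats.gumbel_r.rvs(size=200, random_state=rng),
                          stats.gumbel_r.logpdf)
# Under a gross misspecification the statistic is much larger.
t_bad = kl_gof_statistic(rng.normal(10.0, 0.1, size=200),
                         stats.gumbel_r.logpdf)
```

In practice the critical value for rejecting H_0 would be obtained from the null distribution of the statistic, typically by Monte Carlo simulation.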


Gaussian Kullback-Leibler approximate inference

We investigate Gaussian Kullback-Leibler (G-KL) variational approximate inference techniques for Bayesian generalised linear models and various extensions. In particular we make the following novel contributions: sufficient conditions for which the G-KL objective is differentiable and convex are described; constrained parameterisations of Gaussian covariance that make G-KL methods fast and scal...


Goodness-of-fit Tests for the Inverse Gaussian Distribution Based on the Empirical Laplace Transform

This paper considers two flexible classes of omnibus goodness-of-fit tests for the inverse Gaussian distribution. The test statistics are weighted integrals over the squared modulus of some measure of deviation of the empirical distribution of given data from the family of inverse Gaussian laws, expressed by means of the empirical Laplace transform. Both classes of statistics are connected to t...


Alternative Kullback-Leibler information entropy for enantiomers.

In our series of studies on quantifying chirality, a new chirality measure is proposed in this work based on the Kullback-Leibler information entropy. The index computes the extra information that the shape function of one enantiomer carries over a normalized shape function of the racemate, while in our previous studies the shape functions of the R and S enantiomers were used considering one as...


Bootstrap Estimate of Kullback-Leibler Information for Model Selection

Estimation of Kullback-Leibler amount of information is a crucial part of deriving a statistical model selection procedure which is based on likelihood principle like AIC. To discriminate nested models, we have to estimate it up to the order of constant while the Kullback-Leibler information itself is of the order of the number of observations. A correction term employed in AIC is an example to...
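The role of the correction term mentioned above can be illustrated with AIC itself, where the penalty 2k corrects the optimistic bias of the maximized log-likelihood as an estimate of expected KL information. This is a minimal sketch under assumed data and models (two nested Gaussian fits), not the bootstrap procedure of the paper:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
x = rng.normal(loc=0.3, scale=1.0, size=200)  # true mean is nonzero

# Restricted model: N(0, sigma^2), one free parameter (sigma).
sigma0 = np.sqrt(np.mean(x ** 2))  # MLE of sigma with mean fixed at 0
ll0 = np.sum(stats.norm.logpdf(x, loc=0.0, scale=sigma0))
aic0 = -2 * ll0 + 2 * 1  # penalty 2k corrects the bias of the fitted log-likelihood

# Full model: N(mu, sigma^2), two free parameters.
mu, sigma = np.mean(x), np.std(x)  # MLEs
ll1 = np.sum(stats.norm.logpdf(x, loc=mu, scale=sigma))
aic1 = -2 * ll1 + 2 * 2

# The model with the smaller AIC is preferred.
```

Because the models are nested, the full model's maximized log-likelihood is never smaller than the restricted one's; the penalty term is what keeps the comparison from always favoring the larger model.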



Journal

Journal title: Korean Journal of Applied Statistics

Year: 2011

ISSN: 1225-066X

DOI: 10.5351/kjas.2011.24.6.1271